Agenda

Text Processing

See the accompanying Python file

In [1]:
import re
import urllib.request

import numpy as np
import pandas as pd
from textblob import TextBlob

# Helper functions: nextbigchunk, getstopwords, getdialogue, getsentiment, maverage, alllower
%run lib.py
In [2]:
#name="Legally%20Blonde"
#name="aboutmary"
#name="10Things"
name="magnolia"
#name="Friday%20The%2013th"
#name="Ghost%20Ship"
#name="Juno"
#name="Reservoir+Dogs"
#name="shawshank"
#name="Sixth%20Sense,%20The"
#name="sunset_bld_3_21_49"
#name="Titanic"
#name="toy_story"
#name="trainspotting"
#name="transformers"
#name="the-truman-show_shooting"
#name="batman_production"
In [3]:
ext="html"
# These titles are hosted as plain .txt rather than .html on dailyscript.com
txtfiles=["Ghost%20Ship", "Legally%20Blonde", "Friday%20The%2013th", "Juno", "Reservoir+Dogs", "Sixth%20Sense,%20The", "Titanic"]
if name in txtfiles:
    ext="txt"
fp = urllib.request.urlopen("http://www.dailyscript.com/scripts/"+name+"."+ext)
mybytes = fp.read()
mystr = mybytes.decode("utf8", "ignore")
fp.close()
# Split into lines, drop carriage returns, and strip any HTML tags
liston=mystr.split("\n")
liston=[s.replace('\r', '') for s in liston]
liston=[re.sub('<[^<]+?>', '', s) for s in liston]
In [4]:
# The Shawshank script indents with tabs; normalize them to spaces
if name=="shawshank":
    liston=[i.replace("\t", "    ") for i in liston]
In [5]:
char=""
script=[]
# Indentation prefixes are detected automatically below
charintro=''
endofdialogue=''
dialoguepre=''
newscenepre=''
# Detect the indentation used for character names, parentheticals,
# dialogue and scene headings, starting around line 45 of the script
i=45
print("Characters")
i, charintro=nextbigchunk(liston, i)
print("Adverbs")
i, adverb=nextbigchunk(liston, i, adverbs=True)
print("Dialogues")
i, dialoguepre=nextbigchunk(liston, i)
print("New Scene:")
i, newscenepre=nextbigchunk(liston, i)

if newscenepre=="X":
    i=100
    i, newscenepre=nextbigchunk(liston, i)
    if name=="aboutmary":
        newscenepre=" ".join(["" for i in range(56)])
    if len(newscenepre)==len(charintro):
        newscenepre="X"
    

endofdialogue=newscenepre
    

scene=1
for s in liston:
    # A character cue: starts at the character indentation and is not a parenthetical
    if len(s)>len(charintro) and s[0:len(charintro)]==charintro and s[len(charintro)]!=" " and s.strip()[0]!="(" and s.strip()[-1]!=")":
        char=s[len(charintro):]
        new=dict()
        new['char']=char.strip()
        new['dialogue']=""
        new['scene']=scene
        new['adverb']=""
    # A blank line (or the end-of-dialogue marker) closes the current speech
    if s==endofdialogue or s.replace(" ", "")=="":
        if char!="":
            char=""
            script.append(new)
    # Dialogue lines accumulate onto the current speech
    if char!="" and len(s)>len(dialoguepre) and s[0:len(dialoguepre)]==dialoguepre and s[len(dialoguepre)]!=" ":
        if new['dialogue']!="":
            new['dialogue']=new['dialogue']+" "
        new['dialogue']=new['dialogue']+s[len(dialoguepre):]
    # Parentheticals ("adverbs"): either at their own indentation or wrapped in ( )
    if char!="" and ((len(s)>len(adverb) and s[0:len(adverb)]==adverb and s[len(adverb)]!=" ") or (len(s)>1 and s.strip()[0]=="(" and s.strip()[-1]==")")):
        if new['adverb']!="":
            new['adverb']=new['adverb']+" "
        new['adverb']=new['adverb']+s[len(adverb):]
    # An all-caps line at the scene indentation starts a new scene
    if s[0:len(newscenepre)]==newscenepre and len(s)>len(newscenepre) and s.isupper() and s[len(newscenepre)]!=" ":
        scene=scene+1
Characters
                                magnolia
                                NARRATOR
                                NARRATOR
                                NARRATOR
                                NARRATOR
                                NARRATOR
Adverbs
Dialogues
                      In the New York Herald, November 26,
                      year 1911, there is an account of the
                      hanging of three men --
                      ...they died for the murder of
                      Sir Edmund William Godfrey --
                      -- Husband, Father, Pharmacist and all
New Scene:
     a P.T. Anderson picture                             11/10/98
     a Joanne Sellar/Ghoulardi Film Company production
     
     
     
     
In [6]:
pd.DataFrame(script).to_csv(name+'.csv', index=None)
pd.DataFrame(script)
Out[6]:
adverb char dialogue scene
0 magnolia 1
1 NARRATOR In the New York Herald, November 26, year 1911... 2
2 NARRATOR ...they died for the murder of Sir Edmund Will... 2
3 NARRATOR -- Husband, Father, Pharmacist and all around ... 2
4 NARRATOR Greenberry Hill, London. Population as listed. 3
5 NARRATOR He was murdered by three vagrants whose motive... 5
6 NARRATOR ...Joseph Green..... 5
7 NARRATOR ...Stanley Berry.... 5
8 NARRATOR ...and Nigel Hill... 5
9 NARRATOR Green, Berry and Hill. 7
10 NARRATOR ...And I Would Like To Think This Was Only A M... 7
11 NARRATOR As reported in the Reno Gazzette, June of 1983... 9
12 NARRATOR --- the water that it took to contain the fire -- 10
13 NARRATOR -- and a scuba diver named Delmer Darion. 12
14 NARRATOR Employee of the Peppermill Hotel and Casino, R... 15
15 NARRATOR -- well liked and well regarded as a physical,... 16
16 NARRATOR -- as reported by the coroner, Delmer died of ... 21
17 NARRATOR ...volunteer firefighter, estranged father of ... 24
18 NARRATOR -- added to this, Mr. Hansen's tortured life m... 26
19 CRAIG HANSEN ...oh God...fuck...I'm sorry...I'm sorry... 27
20 NARRATOR The weight of the guilt and the measure of coi... 27
21 CRAIG HANSEN ...forgive me... 27
22 NARRATOR And I Am Trying To Think This Was All Only A M... 29
23 NARRATOR The tale told at a 1961 awards dinner for the ... 32
24 NARRATOR Seventeen year old Sydney Barringer. In the ci... 33
25 NARRATOR The coroner ruled that the unsuccessful suicid... 33
26 NARRATOR The suicide was confirmed by a note, left in t... 34
27 NARRATOR At the same time young Sydney stood on the le... 35
28 NARRATOR The neighbors heard, as they usually did, the... 36
29 NARRATOR -- and it was not uncommon for them to threat... 37
... ... ... ... ...
1493 DIXON We gotta get his money so we can get outta her... 382
1494 WORM That idea is over now. We're not gonna do tha... 382
1495 (to Stanley) DIXON DADDY, FUCK, DADDY, DON'T GET MAD AT ME. DON'T... 382
1496 WORM I'm not mad, son, I will not be mad at you an... 382
1497 DIXON DAD. 382
1498 DIXON I - just - thought - that - I - didn't want - ... 382
1499 WORM It's ok, boy. 382
1500 MUSIC/KERMIT THE FROG "It's not that easy bein' green... Having to s... 383
1501 DONNIE My teeff...my teeef.... 385
1502 JIM KURRING YOU'RE OK...you're gonna be ok.... 385
1503 NARRATOR And there is the account of the hanging of thr... 390
1504 NARRATOR There are stories of coincidence and chance an... 391
1505 NARRATOR ...and we generally say, "Well if that was in... 392
1506 DOCTOR Are you with us? Linda? Is it Linda? 394
1507 NARRATOR Someone's so and so meet someone else's so and... 395
1508 NARRATOR And it is in the humble opinion of this narrat... 398
1509 STANLEY Dad...Dad. 399
1510 STANLEY You have to be nicer to me, Dad. 399
1511 RICK Go to bed. 399
1512 STANLEY I think that you have to be nicer to me. 399
1513 RICK Go to bed. 399
1514 NARRATOR ...and so it goes and so it goes and the book... 400
1515 MARCIE I killed him. I killed my husband. He hit my... 401
1516 DONNIE I know that I did a thtupid thing. Tho-thtupid... 402
1517 DONNIE I really do hath love to give, I juth don't kn... 402
1518 JIM KURRING ...these security systems can be a real joke. ... 403
1519 DONNIE ....ohh-thur-I-thur-thill.... 403
1520 JIM KURRING You guys make alotta money, huh? 403
1521 (beat) JIM KURRING ...alot of people think this is just a job tha... 405
1522 END. 406

1523 rows × 4 columns

In [7]:
magnolia=pd.read_csv(name+'.csv')
stopwords = getstopwords()
In [8]:
# Strip speaker-name suffixes so variants map to the same character
removedchars=["'S VOICE", "'S WHISPER VOICE", " GATOR"]
for s in removedchars:
    magnolia['char']=magnolia['char'].apply(lambda x: x.replace(s, ""))
# Map each scene to the (unique) characters who speak in it
scenes=dict()
for s in magnolia.iterrows():
    scenes.setdefault(s[1]['scene'], []).append(s[1]['char'])
for s in scenes:
    scenes[s]=list(set(scenes[s]))
In [9]:
# Count how many speeches each character has in the whole script
characters=list(set(k for s in scenes for k in scenes[s]))
appearances=dict()
for s in characters:
    appearances[s]=0
for s in magnolia.iterrows():
    appearances[s[1]['char']]=appearances[s[1]['char']]+1
In [10]:
a=pd.DataFrame(appearances, index=[i for i in range(len(appearances))])
In [11]:
# Keep the ten characters with the most speeches
finalcharacters=[]
for s in pd.DataFrame(a.transpose()[0].sort_values(ascending=False))[0:10].iterrows():
    finalcharacters.append(s[0])
In [12]:
finalcharacters
file=open(name+"_nodes.csv", "w")
couplesappearances=dict()
# Header row of the co-appearance matrix: one column per character
for s in finalcharacters:
    file.write(";")
    file.write(s)
file.write("\n")
# For each pair of characters, sum the number of dialogue lines in the
# scenes where both appear (counted once per unordered pair)
for s in finalcharacters:
    newlist=[]
    for f in finalcharacters:
        newlist.append(0)
        couplesappearances[f+"_"+s]=0
    j=0
    for f in finalcharacters:
        for p in scenes:
            if f in scenes[p] and s in scenes[p] and f!=s and finalcharacters.index(f)<finalcharacters.index(s):
                long=len(magnolia[magnolia["scene"]==p])
                newlist[j]=newlist[j]+long
                couplesappearances[f+"_"+s]=couplesappearances[f+"_"+s]+long
        j=j+1
    file.write(s)
    for f in newlist:
        file.write(";")
        file.write(str(f))
    file.write("\n")
file.close()
In [13]:
a=pd.DataFrame(couplesappearances, index=[i for i in range(len(couplesappearances))])
# Keep the four pairs that share the most dialogue
finalcouples=[]
for s in pd.DataFrame(a.transpose()[0].sort_values(ascending=False))[0:4].iterrows():
    finalcouples.append(s[0])
In [14]:
file=open(name+"_finalcharacters.csv", "w")
for s in finalcharacters:
    file.write(s+"\n")
file.close()
file=open(name+"_finalcouples.csv", "w")
for s in finalcouples:
    file.write(s+"\n")
file.close()
In [15]:
importantchars=[]
for char in appearances:
    if appearances[char]>10:
        importantchars.append(char)
In [16]:
%matplotlib inline
import matplotlib.pyplot as plt

file=open(name+"_sentiment_overtime_individual.csv", "w")
file2=open(name+"_sentiment_overtime_individualminsmaxs.csv", "w")

for k in finalcharacters:
    print(k)
    dd=getdialogue(magnolia, k, k, scenes)
    dd=[str(d) for d in dd]
    polarities, subjectivities=getsentiment(dd)
    # Smooth the polarity series with a moving average before plotting
    moveda=maverage(polarities, dd, .99)
    plt.plot(moveda)
    i=0
    for s in moveda:
        file.write(k+","+str(float(i)/len(moveda))+", "+str(s)+"\n")
        i=i+1
    plt.ylabel('polarities')
    plt.show()
    # Log the dialogue lines at the polarity minimum and maximum
    file2.write(k+"| MIN| "+dd[moveda.index(np.min(moveda))]+"\n")
    file2.write(k+"| MAX| "+dd[moveda.index(np.max(moveda))]+"\n")
    print("MIN: "+dd[moveda.index(np.min(moveda))])
    print("\n")
    print("MAX: "+dd[moveda.index(np.max(moveda))])

file.close()
file2.close()

file=open(name+"_sentiment_overtime_couples.csv", "w")
file2=open(name+"_sentiment_overtime_couplesminsmaxs.csv", "w")

for k in finalcouples:
    print(k)
    pair=k.split("_")
    dd=getdialogue(magnolia, pair[0], pair[1], scenes)
    dd=[str(d) for d in dd]
    polarities, subjectivities=getsentiment(dd)
    moveda=maverage(polarities, dd, .99)
    plt.plot(moveda)
    i=0
    for s in moveda:
        file.write(k+","+str(float(i)/len(moveda))+", "+str(s)+"\n")
        i=i+1
    plt.ylabel('polarities')
    plt.show()
    file2.write(k+"| MIN| "+dd[moveda.index(np.min(moveda))]+"\n")
    file2.write(k+"| MAX| "+dd[moveda.index(np.max(moveda))]+"\n")
    print("MIN: "+dd[moveda.index(np.min(moveda))])
    print("\n")
    print("MAX: "+dd[moveda.index(np.max(moveda))])

file.close()
file2.close()
JIM KURRING
MIN: You mind if I check things back here? 


MAX: YOU'RE OK...you're gonna be ok....
JIMMY
MIN: She went crazy.  She went crazy, Rose. 


MAX: Imagine you are attending a jam session of classical composers and they have  each done an arrangment of the classic  favorite, "Whispering."  Here are three  variations on the theme, as three classic  composer's might have written it -- you are to name the composer.  The First: 
CLAUDIA
MIN: I'm sorry. 


MAX: Did you ever go out with someone and just....lie....question after question, maybe you're trying to  make yourself look cool or better  than you are or whatever, or smarter  or cooler and you just -- not really lie, but maybe you just don't say everything --
FRANK
MIN: If you feel, made to feel like you need them, like -- like you can't live if you're without them or you need, what?  They're pussy?  They're love? Fuck that.  Self Sufficient, gents.  That's the truth. What you are -- we are -- you need them  for what?  To fucking make you a piece of snot rag?  A puppett?  huh?  Hear them bitch and moan? bitch and moan --  and we're taught one thing -- go the other way -- there is No Excuse I will give you, I'm not gonna apologize -- I'm not gonna  apologize for my NEED my DESIRE...my, the  things that I need as a man to feel comfortable... You understand?  You understand?  You need to say something, "my mommy hit me or  daddy hit me or didn't let me play soccer,  so now I make mistakes, cause a that -- something, so now I piss and shit on it and do this." Bullshit.  I'm sorry. ok. yeah. no. fuck.  go.  fuck. alright. go make a new mistake. maybe not, I dunno...fuck.... 


MAX: I wouldn't want that to be misunderstood: My enrollment was totally unoffical because I was, sadly, unable to afford tuition up  there.  But there were three wonderful men who were kind enough to let me sit in on their classes, and they're names are:  Macready, Horn and Langtree among others. I was completely independent financially, and like I said: One Sad Sack A Shit.  So what we're looking at here is a true rags to riches story and I think that's  what most people respond to in "Seduce," And At The End Of The Day? Hey -- it may not  even be about picking up chicks and sticking your cock in it -- it's about finding What You Can Be In This World.  Defining It.  Controling It and  saying: I will take what is mine.  You just happen  to get a blow job out of it, then hey-what-the-fuck- why-not?  he.he.he.
PHIL
MIN: You wanna call him on the phone? We can call him, I can dial the  phone if you can remember the number -- 


MAX: Thank you, Chad, and good luck to you and your mother -- 
STANLEY
MIN: I think that you have to be nicer to me.


MAX: I'm fine. I'm fine, I just wanna keep playing --
DONNIE
MIN: My teeff...my teeef....


MAX: My name is Donnie Smith and I have lot's of love to give. 
EARL
MIN: No, no, the grade...the grade that you're in? 


MAX: "...it's not going to stop 'till you wise up..."
LINDA
MIN: listen...listen to me now, Phil:  I'm sorry, sorry I slapped your face.  ...because I don't know what I'm doing... ...I don't know how to do this, y'know?  You understand?  y'know?  I...I'm...I do things  and I fuck up and I fucked up....forgive me, ok? Can you just...


MAX: I'm listening.  I'm getting better. 
NARRATOR
MIN: -- added to this, Mr. Hansen's tortured life met before with Delmer Darion just two nights previous --


MAX: So Fay Barringer was charged with the  murder of her son and Sydney Barringer  noted as an accomplice in his own death...
JIM KURRING_CLAUDIA
MIN: You mind if I check things back here? 


MAX: ok. 
JIMMY_STANLEY
MIN: I don't mean to cry, I'm sorry. 


MAX: Imagine you are attending a jam session of classical composers and they have  each done an arrangment of the classic  favorite, "Whispering."  Here are three  variations on the theme, as three classic  composer's might have written it -- you are to name the composer.  The First: 
PHIL_EARL
MIN: -- it's not him. it's not him. He's the fuckin' asshole...Phil..c'mere... 


MAX: ...ah...maybe...yeah...she's a good one... 
FRANK_PHIL
MIN: When they put me on hold, to  talk to you...they play the tapes.  I mean: I'd seen the commercials and heard about you, but I'd never heard the tapes ....


MAX: I just...he was...but I gave him,  I just had to give him a small dose of  liquid morphine.  He hasn't been able to swallow the morphine pills so we now,  I just had to go to the liquid morphine... For the pain, you understand? 
In [17]:
# Tag every character present in a scene with an INSCENE_ prefix, so that
# "being present" and "speaking" become distinct basket items below
# (the original mutated the list while iterating over it)
for key in scenes:
    scenes[key]=["INSCENE_"+s for s in scenes[key]]
In [18]:
# Drop speeches with no dialogue text
magnolia=magnolia.dropna(subset=['dialogue'])
In [19]:
# One "basket" per speech: the characters present in the scene, the speaker
# (tagged SPEAKING_), punctuation attributes (? and !) and the lower-cased
# dialogue words with surrounding punctuation stripped
baskets=[]
spchars=["\"", "'", ".", ",", "-"]
attributes=["?", "!"]
for s in magnolia.iterrows():
    if type(s[1]['dialogue'])!=float and len(s[1]['dialogue'])>0:
        new=[]
        for k in scenes[s[1]['scene']]:
            new.append(k)
        new.append("SPEAKING_"+s[1]['char'])
        for k in s[1]['dialogue'].split(" "):
            ko=k
            for t in spchars:
                ko=ko.replace(t, "")
            for t in attributes:
                if ko.find(t)>=0:
                    new.append(t)
                    ko=ko.replace(t, "")
            if len(ko)>0:
                new.append(ko.lower())
        new=list(set(new))
        baskets.append(new)
In [20]:
# Remove stopwords; keep a one-hot dict (for apriori) and a plain list per basket
baskets2=[]
basketslist=[]
for k in baskets:
    new=dict()
    new2=[]
    for t in k:
        if t not in stopwords:
            new[t]=1
            new2.append(t)
    baskets2.append(new)
    basketslist.append(new2)
In [21]:
from mlxtend.frequent_patterns import apriori
from mlxtend.frequent_patterns import association_rules

baskets2=pd.DataFrame(baskets2)
baskets2=baskets2.fillna(0)
baskets2.to_csv(name+'_basket.csv')
In [22]:
# Keep itemsets that appear in at least 5 baskets, then rules with lift > 1
frequent_itemsets = apriori(baskets2, min_support=5/len(baskets2), use_colnames=True)
rules = association_rules(frequent_itemsets, metric="lift", min_threshold=1)
In [23]:
# Flag rules where at least one side consists only of lower-case word items
# (note: this mlxtend version spells the column 'antecedants')
rules['one_lower']=[int(alllower(i) or alllower(j)) for i, j in zip(rules['antecedants'], rules['consequents'])]
In [24]:
# Flag rules where both sides consist only of lower-case word items
rules['both_lower']=[int(alllower(i) and alllower(j)) for i, j in zip(rules['antecedants'], rules['consequents'])]
In [25]:
rules.to_csv(name+'_rules.csv', index=None)

Sentiment Analysis (Movie & Character)

Movie Score

Title
SICK BOY
Number of words/tokens in the original text
Distinct words
1268
Sentiment scale from negative to positive: afinn
Description Score % Words Found
Between 0 (negative) and 10 (positive) 4.576772 12.3%
Percentage of words found per sentiment type (bing): 14.8%
sentiment Percentage
positive 50.302%
negative 49.698%
Percentage of words found per sentiment type (nrc): 22%
sentiment Percentage
positive 17.3%
negative 15.7%
trust 11.7%
anticipation 10.8%
joy 9.4%
sadness 8.5%
fear 7.1%
surprise 6.7%
anger 6.4%
disgust 6.4%
Percentage of words found per sentiment type (loughran): 6.15%
sentiment Percentage
negative 43.3%
positive 34.8%
uncertainty 15.4%
litigious 4.5%
constraining 1.5%
superfluous 0.5%
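The afinn figures above come from a lexicon-based scorer: each known word carries an integer valence in [-5, 5], the valences of the matched words are averaged, and the result is shifted onto the 0-10 scale; the "% Words Found" column is the lexicon's coverage of the text. A minimal sketch of that idea (the tiny lexicon below is illustrative, not the real AFINN word list):

```python
# Toy lexicon for illustration only; the real AFINN list has ~2,500 entries.
toy_afinn = {"love": 3, "great": 3, "good": 2, "bad": -3, "hate": -3, "murder": -2}

def afinn_style_score(text, lexicon):
    """Return (score on a 0-10 scale, fraction of words found in the lexicon)."""
    words = [w.strip('.,!?"\'').lower() for w in text.split()]
    hits = [lexicon[w] for w in words if w in lexicon]
    if not hits:
        return 5.0, 0.0            # neutral midpoint when nothing matches
    mean = sum(hits) / len(hits)   # AFINN valences live in [-5, 5]
    return mean + 5, len(hits) / len(words)

score, coverage = afinn_style_score("I love this great movie, but the murder was bad.", toy_afinn)
# -> score 5.25 (slightly positive), coverage 0.4 (4 of 10 words matched)
```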

Character Score

[1] “Sentiment Analysis for Character: RENTON” [1] “Total number of unique words in the text: 494”

Sentiment scale from negative to positive: afinn
Description Score % Words Found
Between 0 (negative) and 10 (positive) 4.904 12.1%
Percentage of words found per sentiment type (bing): 14.4%
sentiment Percentage
positive 53.39%
negative 46.61%
Percentage of words found per sentiment type (nrc): 17.2%
sentiment Percentage
positive 17.8%
negative 14.8%
trust 11.1%
anticipation 10.4%
joy 9.4%
sadness 8.8%
surprise 7.7%
disgust 7.4%
fear 6.4%
anger 6.1%
Percentage of words found per sentiment type (loughran): 4.86%
sentiment Percentage
positive 44.2%
negative 37.2%
uncertainty 18.6%

[1] “Sentiment Analysis for Character: SICK BOY” [1] “Total number of unique words in the text: 469”

Sentiment scale from negative to positive: afinn
Description Score % Words Found
Between 0 (negative) and 10 (positive) 4.8875 9.38%
Percentage of words found per sentiment type (bing): 11.9%
sentiment Percentage
positive 56.4%
negative 43.6%
Percentage of words found per sentiment type (nrc): 13.9%
sentiment Percentage
positive 20.7%
anticipation 12.7%
negative 12.7%
trust 12.2%
joy 10.8%
sadness 7.0%
fear 6.6%
anger 6.1%
surprise 6.1%
disgust 5.2%
Percentage of words found per sentiment type (loughran): 4.9%
sentiment Percentage
positive 56.4%
negative 33.3%
uncertainty 7.7%
constraining 2.6%

[1] “Sentiment Analysis for Character: BEGBIE” [1] “Total number of unique words in the text: 238”

Sentiment scale from negative to positive: afinn
Description Score % Words Found
Between 0 (negative) and 10 (positive) 2.75 11.8%
Percentage of words found per sentiment type (bing): 13%
sentiment Percentage
negative 78.6%
positive 21.4%
Percentage of words found per sentiment type (nrc): 18.9%
sentiment Percentage
negative 25.9%
disgust 14.4%
sadness 12.9%
fear 10.1%
anger 8.6%
positive 7.9%
trust 6.5%
surprise 5.8%
joy 4.3%
anticipation 3.6%
Percentage of words found per sentiment type (loughran): 4.62%
sentiment Percentage
negative 60.0%
positive 20.0%
litigious 13.3%
uncertainty 6.7%

[1] “Sentiment Analysis for Character: SPUD” [1] “Total number of unique words in the text: 299”

Sentiment scale from negative to positive: afinn
Description Score % Words Found
Between 0 (negative) and 10 (positive) 5.521127 12.7%
Percentage of words found per sentiment type (bing): 9.7%
sentiment Percentage
positive 71.2%
negative 28.8%
Percentage of words found per sentiment type (nrc): 12.7%
sentiment Percentage
positive 16.3%
trust 14.1%
negative 13.3%
anticipation 11.9%
joy 9.6%
fear 8.9%
surprise 8.9%
sadness 7.4%
anger 5.2%
disgust 4.4%
Percentage of words found per sentiment type (loughran): 5.69%
sentiment Percentage
negative 34.6%
positive 34.6%
uncertainty 26.9%
litigious 3.8%

[1] “Sentiment Analysis for Character: DIANE” [1] “Total number of unique words in the text: 187”

Sentiment scale from negative to positive: afinn
Description Score % Words Found
Between 0 (negative) and 10 (positive) 6.047619 10.2%
Percentage of words found per sentiment type (bing): 12.8%
sentiment Percentage
positive 75.9%
negative 24.1%
Percentage of words found per sentiment type (nrc): 18.7%
sentiment Percentage
positive 21.7%
negative 14.1%
joy 13.0%
trust 13.0%
surprise 8.7%
fear 7.6%
sadness 6.5%
anger 5.4%
anticipation 5.4%
disgust 4.3%
Percentage of words found per sentiment type (loughran): 4.28%
sentiment Percentage
negative 50.0%
positive 25.0%
litigious 12.5%
uncertainty 12.5%

[1] “Sentiment Analysis for Character: TOMMY” [1] “Total number of unique words in the text: 179”

Sentiment scale from negative to positive: afinn
Description Score % Words Found
Between 0 (negative) and 10 (positive) 4.473684 13.4%
Percentage of words found per sentiment type (bing): 10.1%
sentiment Percentage
positive 51.72%
negative 48.28%
Percentage of words found per sentiment type (nrc): 12.8%
sentiment Percentage
anticipation 18.1%
negative 16.7%
positive 15.3%
trust 12.5%
joy 11.1%
anger 6.9%
sadness 6.9%
surprise 6.9%
disgust 4.2%
fear 1.4%
Percentage of words found per sentiment type (loughran): 3.91%
sentiment Percentage
positive 40%
negative 30%
uncertainty 20%
superfluous 10%

[1] “Sentiment Analysis for Character: SWANNEY” [1] “Total number of unique words in the text: 142”

Sentiment scale from negative to positive: afinn
Description Score % Words Found
Between 0 (negative) and 10 (positive) 5.153846 7.75%
Percentage of words found per sentiment type (bing): 8.45%
sentiment Percentage
positive 57.1%
negative 42.9%
Percentage of words found per sentiment type (nrc): 14.8%
sentiment Percentage
positive 24.6%
trust 20.0%
anticipation 13.8%
joy 10.8%
fear 7.7%
negative 7.7%
surprise 6.2%
sadness 4.6%
anger 3.1%
disgust 1.5%
Percentage of words found per sentiment type (loughran): 3.52%
sentiment Percentage
negative 50.0%
constraining 16.7%
litigious 16.7%
uncertainty 16.7%

[1] “Sentiment Analysis for Character: MOTHER” [1] “Total number of unique words in the text: 112”

Sentiment scale from negative to positive: afinn
Description Score % Words Found
Between 0 (negative) and 10 (positive) 4.125 7.14%
Percentage of words found per sentiment type (bing): 6.25%
sentiment Percentage
negative 77.8%
positive 22.2%
Percentage of words found per sentiment type (nrc): 11.6%
sentiment Percentage
negative 20.0%
sadness 17.1%
fear 14.3%
positive 14.3%
anticipation 8.6%
joy 8.6%
trust 8.6%
anger 2.9%
disgust 2.9%
surprise 2.9%
Percentage of words found per sentiment type (loughran): 4.46%
sentiment Percentage
negative 85.7%
positive 14.3%

[1] “Sentiment Analysis for Character: GAIL” [1] “Total number of unique words in the text: 65”

Sentiment scale from negative to positive: afinn
Description Score % Words Found
Between 0 (negative) and 10 (positive) 4.375 12.3%
Percentage of words found per sentiment type (bing): 13.8%
sentiment Percentage
negative 55.6%
positive 44.4%
Percentage of words found per sentiment type (nrc): 13.8%
sentiment Percentage
negative 19.2%
positive 19.2%
anticipation 15.4%
joy 15.4%
trust 15.4%
fear 7.7%
sadness 3.8%
surprise 3.8%
Percentage of words found per sentiment type (loughran): 4.62%
sentiment Percentage
negative 33.3%
positive 33.3%
uncertainty 33.3%

[1] “Sentiment Analysis for Character: MAN” [1] “Total number of unique words in the text: 92”

Sentiment scale from negative to positive: afinn
Description Score % Words Found
Between 0 (negative) and 10 (positive) 4.727273 7.61%
Percentage of words found per sentiment type (bing): 10.9%
sentiment Percentage
negative 53.33%
positive 46.67%
Percentage of words found per sentiment type (nrc): 7.61%
sentiment Percentage
positive 52.9%
trust 29.4%
joy 11.8%
sadness 5.9%
Percentage of words found per sentiment type (loughran): 3.26%
sentiment Percentage
positive 100%

Character Score over Time

Top 10 Characters

Peak dialogues for the Top 10 Characters: trainspotting
Character Min_Max Dialogue
RENTON MIN I want a fucking hit.
RENTON MAX Sounds great, Swanney.
SICK BOY MIN Fuck you. OK, so Tommy’s got the virus. Bad news, big deal. The gig goes on, or hadn’t you noticed? Swanney fucks his leg up. Well, tough shit, but it could have been worse.
SICK BOY MAX No, it’s not bad, but it’s not great either, is it? And in your heart you kind of know that although it sounds all right, it’s actually just shite.
BEGBIE MIN Because I fucking told you to do that, you doss cunt.
BEGBIE MAX I’m no a fucking buftie and that’s the end of it.
SPUD MIN A little dab of speed is just the ticket.
SPUD MAX The pleasure was mine. Best interview I’ve ever been to. Thanks.
DIANE MIN Shut up.
DIANE MAX It’s where I live.
TOMMY MIN Well, what are you waiting for?
TOMMY MAX Thanks, Mark.
SWANNEY MIN Well, it’s up to you.
SWANNEY MAX you’ll need one more hit.
MOTHER MIN No problem for me either. Honestly, it’s no problem.
MOTHER MAX No. No clinics, no methadone. That made you worse, you said so yourself. You lied to us, son, your own mother and father.
GAIL MIN Not much.
GAIL MAX It’s all right. I slept fine on the sofa.
MAN MIN And who the fuck do you think you are?
MAN MAX But it’s not worth more than fifteen.

Top 4 Pairs

Peak dialogues for the Top 4 Pairs: trainspotting
Pair Min_Max Dialogue
RENTON_SICK BOY MIN And I got a stitch stuck between my teeth, jerked my head back and the whole fucking stump fell off.
RENTON_SICK BOY MAX Despite the Academy award?
SICK BOY_BEGBIE MIN Because I fucking told you to do that, you doss cunt.
SICK BOY_BEGBIE MAX We’re two thousand short.
RENTON_BEGBIE MIN You’re not going to and fucking hospital. You’re staying there. And you bring me a fucking cigarette.
RENTON_BEGBIE MAX But you don’t have the money?
RENTON_SPUD MIN Sorry, boys, I don’t have two thousand pounds.
RENTON_SPUD MAX Right.

Word Association Rules (Market Basket)

Whole movie

## [1] "Average lift of the association rules: 8.02944052745034"
## [1] "Standard deviation of the lift of the association rules: 9.87228101321947"
## [1] "Lift deciles: "
##       10%       20%       30%       40%       50%       60%       70% 
##  1.363388  1.851577  2.434146  3.205523  3.929134  5.280423  7.920635 
##       80%       90%      100% 
## 10.915625 21.695652 83.166667
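These summary statistics can be reproduced directly from the `lift` column of the rules table; a sketch with a stand-in DataFrame (the values below are illustrative, not the movie's real lift distribution):

```python
import numpy as np
import pandas as pd

# Stand-in for the real rules DataFrame produced by association_rules
rules = pd.DataFrame({"lift": [1.2, 1.9, 2.4, 3.2, 3.9, 5.3, 7.9, 10.9, 21.7, 83.2]})

mean_lift = rules["lift"].mean()
std_lift = rules["lift"].std()                       # sample std (ddof=1)
deciles = np.percentile(rules["lift"], range(10, 101, 10))
print("Average lift:", mean_lift)
print("Std of lift:", std_lift)
print("Lift deciles:", deciles)
```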

Histogram data: Lift, movie: SICK BOY
Number of Dialogues Lift Min Lift Max
1,386 -1 1
4,064 1 4
1,576 4 7
510 7 10
948 10 13
118 13 16
## [1] "Average leverage of the association rules: 0.01313658802416"
## [1] "Standard deviation of the leverage of the association rules: 0.0103336215657906"
## [1] "Leverage deciles: "
##         10%         20%         30%         40%         50%         60% 
## 0.003764242 0.006381500 0.007634507 0.008995948 0.009718836 0.011534090 
##         70%         80%         90%        100% 
## 0.013985486 0.021457745 0.024851306 0.103859021

Histogram data: Leverage, movie: SICK BOY
Number of Dialogues Leverage Min Leverage Max
314 -0.0018 0.0018
1,216 0.0018 0.0054
2,494 0.0054 0.009
2,494 0.009 0.013
1,044 0.013 0.016
472 0.016 0.02
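Leverage measures how much more often the two sides of a rule co-occur than they would if they were independent: leverage(A→B) = support(A∪B) − support(A)·support(B). A one-line sketch of the definition, with illustrative support values rather than numbers from the movie:

```python
# leverage(A -> B) = support(A and B) - support(A) * support(B)
def leverage(support_ab, support_a, support_b):
    return support_ab - support_a * support_b

# Illustrative supports: A and B co-occur four times as often as independence predicts
lev = leverage(0.02, 0.05, 0.10)
# -> 0.015
```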

Top 10 Characters

Top 4 Pairs

Character Relationship Analysis (Pagerank)

Pagerank: Reservoir Dogs.
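The pagerank figures are computed over the character co-appearance graph, where edge weights are shared-scene dialogue counts like those written to the `*_nodes.csv` matrix earlier. A self-contained sketch by power iteration (pure Python; the characters and weights below are illustrative, not the real counts):

```python
# Illustrative weighted co-appearance edges, not the real *_nodes.csv values
edges = [("FRANK", "PHIL", 12), ("PHIL", "EARL", 30),
         ("JIM KURRING", "CLAUDIA", 25), ("JIMMY", "STANLEY", 8)]

# Symmetric weighted adjacency map
adj = {}
for a, b, w in edges:
    adj.setdefault(a, {})[b] = w
    adj.setdefault(b, {})[a] = w

nodes = list(adj)
n = len(nodes)
d = 0.85                              # damping factor
pr = {v: 1.0 / n for v in nodes}      # start from the uniform distribution
for _ in range(100):
    new = {}
    for v in nodes:
        # Each neighbour u passes its rank on, split in proportion to edge weight
        inflow = sum(pr[u] * adj[u][v] / sum(adj[u].values()) for u in adj[v])
        new[v] = (1 - d) / n + d * inflow
    pr = new

ranking = sorted(pr, key=pr.get, reverse=True)
print(ranking)
```

Because PHIL sits between FRANK and EARL in this toy graph, he collects rank from both and comes out on top.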